Dance to the Beat: Enhancing Dancing Performance in Video
Author
Yanir Kleiman, Daniel Cohen-Or
Abstract
In this paper we introduce a video postprocessing method that enhances the rhythm of a dancing performance, in the sense that the dancing movements become more cohesive with the beat of the music. The dancing performance as observed in a video is analyzed and segmented into motion intervals delimited by motion-beats. We present an image-space method to extract the motion-beats of a video by detecting frames at which there is a significant change in direction or the motion stops. The motion-beats are then synchronized with the music-beats such that as many beats as possible are matched, with as little time-warping distortion to the video as possible. We show two applications of this cross-media synchronization: one where a given dance performance is enhanced to be better synchronized with its original music, and one where a given dance video is automatically adapted to be synchronized with different music.
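The two core steps the abstract describes, detecting motion-beats as frames where motion stops or reverses, then pairing them with music-beats under a bounded time-warp, can be sketched as follows. This is a minimal illustration, not the paper's actual image-space method: it assumes a precomputed 1-D array of per-frame motion magnitudes (e.g. mean optical-flow magnitude), and the helper names, the threshold, and the greedy nearest-beat matching are all hypothetical simplifications.

```python
import numpy as np

def detect_motion_beats(motion_magnitude, threshold=0.2):
    """Return frame indices that are motion-beat candidates: local minima
    of per-frame motion magnitude that fall below `threshold`, i.e. frames
    where the motion (nearly) stops before changing direction.
    `motion_magnitude` is an assumed precomputed 1-D per-frame signal."""
    m = np.asarray(motion_magnitude, dtype=float)
    beats = []
    for t in range(1, len(m) - 1):
        if m[t] <= m[t - 1] and m[t] <= m[t + 1] and m[t] < threshold:
            beats.append(t)
    return beats

def match_beats(motion_beats, music_beats, max_warp=5):
    """Greedily pair each motion-beat with its nearest music-beat, but only
    if the required shift is within `max_warp` frames, so many beats are
    matched while the implied time-warping distortion stays small."""
    pairs = []
    for mb in motion_beats:
        nearest = min(music_beats, key=lambda b: abs(b - mb))
        if abs(nearest - mb) <= max_warp:
            pairs.append((mb, nearest))
    return pairs
```

For example, a motion signal dipping near zero at frames 2 and 6 yields motion-beats `[2, 6]`, which `match_beats` would pair with nearby music-beats at frames 3 and 7; the actual paper then time-warps the video between matched pairs rather than simply shifting frames.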
Source
DOI: 10.1007/s41095-018-0115-y
Comments
Emphasizes the moments of the beat
URL
https://www.cs.tau.ac.il/~dcor/articles/2018/Dance-z.pdf
https://www.semanticscholar.org/paper/Dance-to-the-Beat-%3A-Enhancing-Dancing-Performance-Kleiman-Cohen-Or/c6909b6a9a3e11163927bf695841ee61a82c9fa1
Tag
#2018
#beat #rhythm #video #synchronization #cross-media #generation #automation
#Augmentation
#Generation